The Best 474 Question Answering System Tools in 2025
Distilbert Base Cased Distilled Squad
Apache-2.0
DistilBERT is a lightweight distilled version of BERT, with 40% fewer parameters and 60% faster inference, while retaining over 95% of BERT's performance. This version is fine-tuned for question answering on the SQuAD v1.1 dataset.
Question Answering System · English
distilbert · Downloads: 220.76k · Likes: 244

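Models fine-tuned on SQuAD, like this one, output per-token start and end logits; the predicted answer is the highest-scoring valid span. A minimal sketch of that decoding step, with made-up logits standing in for real model output:

```python
import numpy as np

def best_span(start_logits, end_logits, max_answer_len=15):
    """Pick the (start, end) pair with the highest combined logit,
    subject to start <= end and a maximum answer length."""
    best, best_score = (0, 0), -np.inf
    for s in range(len(start_logits)):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best, best_score = (s, e), score
    return best

tokens = ["The", "Eiffel", "Tower", "is", "in", "Paris"]
start_logits = np.array([0.1, 0.2, 0.1, 0.0, 0.1, 3.0])
end_logits = np.array([0.0, 0.1, 0.2, 0.1, 0.0, 2.5])
s, e = best_span(start_logits, end_logits)
print(" ".join(tokens[s:e + 1]))  # -> Paris
```

The `max_answer_len` cap mirrors the length limit real QA pipelines apply so an absurdly long span can't win on summed logits alone.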
Distilbert Base Uncased Distilled Squad
Apache-2.0
DistilBERT is a lightweight distilled version of BERT, with 40% fewer parameters and 60% faster inference, maintaining over 95% of BERT's performance on the GLUE benchmark. This version is fine-tuned specifically for question answering tasks.
Question Answering System · Transformers · English
distilbert · Downloads: 154.39k · Likes: 115

Tapas Large Finetuned Wtq
Apache-2.0
TAPAS is a table question answering model based on the BERT architecture, pre-trained in a self-supervised manner on Wikipedia table data; it answers natural-language questions about table content.
Question Answering System · Transformers · English
google · Downloads: 124.85k · Likes: 141

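TAPAS flattens the table and gives each cell token learned row and column position embeddings. A toy sketch of that cell-indexing step (the real tokenizer does this internally; the triple format here is purely illustrative):

```python
# Toy table; row 0 is reserved for the header row, as in TAPAS.
table = {"City": ["Paris", "Rome"], "Population": ["2.1M", "2.8M"]}

# Flatten to (token, row_id, column_id) triples, the coordinates
# that row/column position embeddings would be looked up from.
cells = []
for col_id, (header, values) in enumerate(table.items(), start=1):
    cells.append((header, 0, col_id))
    for row_id, value in enumerate(values, start=1):
        cells.append((value, row_id, col_id))

print(cells[:3])  # -> [('City', 0, 1), ('Paris', 1, 1), ('Rome', 2, 1)]
```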
T5 Base Question Generator
A question generation model based on t5-base: given an answer and its context, it generates the corresponding question.
Question Answering System · Transformers
iarfmoose · Downloads: 122.74k · Likes: 57

Bert Base Cased Qa Evaluator
A QA-pair evaluation model based on BERT-base-cased, used to judge the semantic relevance between a question and an answer.
Question Answering System
iarfmoose · Downloads: 122.54k · Likes: 9

Tiny Doc Qa Vision Encoder Decoder
MIT
A tiny vision encoder-decoder model for document Q&A, intended primarily for testing purposes.
Question Answering System · Transformers
fxmarty · Downloads: 41.08k · Likes: 16

Dpr Question Encoder Single Nq Base
DPR (Dense Passage Retrieval) is a tool and model for open-domain question answering research. This model is a BERT-based question encoder trained on the Natural Questions (NQ) dataset.
Question Answering System · Transformers · English
facebook · Downloads: 32.90k · Likes: 30

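DPR pairs this question encoder with a separate passage encoder, and ranks passages by the dot product of the two embeddings. A toy sketch with hand-made low-dimensional vectors standing in for the real 768-dimensional encodings:

```python
import numpy as np

# Stand-ins for encoder outputs; real DPR vectors are 768-d.
question_vec = np.array([1.0, 0.0, 1.0])
passage_vecs = np.array([
    [0.9, 0.1, 0.8],  # on-topic passage
    [0.0, 1.0, 0.1],  # off-topic
    [0.2, 0.2, 0.3],
])

scores = passage_vecs @ question_vec  # dot-product similarity
ranking = np.argsort(-scores)         # best match first
print(ranking.tolist())  # -> [0, 2, 1]
```

At scale this dot-product search is what approximate nearest-neighbor indexes accelerate; the scoring rule itself stays this simple.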
Mobilebert Uncased Squad V2
MIT
MobileBERT is a lightweight version of BERT_LARGE, fine-tuned on the SQuAD2.0 dataset for question answering systems.
Question Answering System · Transformers · English
csarron · Downloads: 29.11k · Likes: 7

Tapas Base Finetuned Wtq
Apache-2.0
TAPAS is a Transformer-based table question answering model, pre-trained on Wikipedia table data through self-supervised learning and fine-tuned on datasets like WTQ.
Question Answering System · Transformers · English
google · Downloads: 23.03k · Likes: 217

Dpr Question Encoder Multiset Base
A BERT-based Dense Passage Retrieval (DPR) question encoder for open-domain QA research, trained on multiple QA datasets.
Question Answering System · Transformers · English
facebook · Downloads: 17.51k · Likes: 4

Roberta Base On Cuad
MIT
A RoBERTa-base model fine-tuned for legal contract Q&A, designed specifically for contract review.
Question Answering System · Transformers · English
Rakib · Downloads: 14.79k · Likes: 8

Mdeberta V3 Base Squad2
MIT
A multilingual Q&A model based on mDeBERTa-v3-base, fine-tuned on the SQuAD2.0 dataset.
Question Answering System · Transformers · Multilingual
timpal0l · Downloads: 14.06k · Likes: 246

T5 Base Qg Hl
MIT
An answer-aware question generation model trained on T5-base; it generates questions for answer spans highlighted in the input text.
Question Answering System · Transformers
valhalla · Downloads: 11.65k · Likes: 12

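Answer-aware QG models of this family mark the answer span with a highlight token in the input text. A small preprocessing sketch; the `<hl>` sentinel and the `highlight_answer` helper are illustrative, so check the model card for the exact expected format:

```python
def highlight_answer(context, answer, hl_token="<hl>"):
    """Wrap the first occurrence of the answer span in highlight tokens."""
    idx = context.index(answer)
    return (context[:idx] + f"{hl_token} {answer} {hl_token} "
            + context[idx + len(answer):].lstrip())

context = "Python was created by Guido van Rossum."
source = "generate question: " + highlight_answer(context, "Guido van Rossum")
print(source)
# -> generate question: Python was created by <hl> Guido van Rossum <hl> .
```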
T5 Base E2e Qg
MIT
An end-to-end question generation model trained on T5-base, capable of automatically generating relevant questions from input text.
Question Answering System · Transformers
valhalla · Downloads: 10.58k · Likes: 29

Bert Large Finetuned Squad2
A question-answering model based on the bert-large-uncased architecture, fine-tuned on the SQuAD2.0 dataset.
Question Answering System · English
phiyodr · Downloads: 9,892 · Likes: 0

Electra Large Discriminator Squad2 512
An ELECTRA-large discriminator model fine-tuned for Q&A on the SQuAD2.0 dataset, capable of handling both answerable and unanswerable questions.
Question Answering System · Transformers
ahotrod · Downloads: 8,925 · Likes: 6

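A common way SQuAD2.0 models decide between answering and abstaining is to compare the best span score against the null score taken at the [CLS] position. A toy sketch of that thresholding, with made-up logits:

```python
import numpy as np

def answer_or_null(start_logits, end_logits, threshold=0.0):
    """Return the best (start, end) span, or None when the null score
    at position 0 (the [CLS] token) wins by at least `threshold`."""
    null_score = start_logits[0] + end_logits[0]
    best, best_score = None, -np.inf
    for s in range(1, len(start_logits)):
        for e in range(s, len(end_logits)):
            if start_logits[s] + end_logits[e] > best_score:
                best, best_score = (s, e), start_logits[s] + end_logits[e]
    return best if best_score - null_score > threshold else None

# Answerable case: a real span clearly beats the null score.
print(answer_or_null(np.array([0.5, 0.1, 2.0]), np.array([0.5, 0.2, 1.8])))  # -> (2, 2)
# Unanswerable case: the null score dominates.
print(answer_or_null(np.array([3.0, 0.1, 0.2]), np.array([3.0, 0.2, 0.1])))  # -> None
```

Tuning `threshold` trades precision on answerable questions against the rate of false "no answer" predictions.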
Distilbert Onnx
Apache-2.0
This is a question-answering model fine-tuned on the SQuAD v1.1 dataset using knowledge distillation techniques, based on the DistilBERT-base-cased model.
Question Answering System · Transformers · English
philschmid · Downloads: 8,650 · Likes: 2

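The knowledge distillation mentioned here trains the student to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of the soft-target KL term in the standard formulation, not this model's exact training code:

```python
import numpy as np

def softmax(logits, T=1.0):
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # numerical stability
    p = np.exp(z)
    return p / p.sum()

def distill_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on temperature-T softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

# Identical logits -> zero loss; diverging logits -> positive loss.
print(distill_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))      # -> 0.0
print(distill_loss([2.0, 0.5, -1.0], [0.0, 0.0, 0.0]) > 0)   # -> True
```

In practice this soft-target term is combined with the ordinary hard-label loss on the task data.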
Biobert V1.1 Biomedicalquestionanswering
A biomedical-domain Q&A model fine-tuned from BioBERT-v1.1.
Question Answering System · Transformers
Shushant · Downloads: 7,145 · Likes: 6

Tapas Base Finetuned Tabfact
Apache-2.0
TAPAS is a BERT-like model based on the Transformer architecture, specifically designed for processing tabular data. It is pre-trained in a self-supervised manner on English Wikipedia table data and fine-tuned on the TabFact dataset to determine whether a sentence is supported or refuted by table content.
Question Answering System · Transformers · English
google · Downloads: 6,669 · Likes: 1

T5 Base Finetuned Question Generation Ap
Apache-2.0
This model is based on the T5-base architecture and fine-tuned on the SQuAD v1.1 dataset, specifically designed for question generation tasks.
Question Answering System · Transformers · English
mrm8488 · Downloads: 6,562 · Likes: 109

Bert Tiny Finetuned Squadv2
A compact Q&A model based on Google's BERT-Tiny architecture, fine-tuned on the SQuAD2.0 dataset; it is only 16.74MB in size.
Question Answering System · English
mrm8488 · Downloads: 6,327 · Likes: 1

Qnli Electra Base
Apache-2.0
This is a cross-encoder model based on the ELECTRA architecture, specifically designed for natural language inference (NLI) in question-answering tasks, determining whether a given question can be answered by a specific paragraph.
Question Answering System · Transformers · English
cross-encoder · Downloads: 6,172 · Likes: 3

Albert Base V2 Squad2
A Q&A model based on the ALBERT base v2 architecture, fine-tuned on the SQuAD v2 dataset; it performs well on reading comprehension tasks that include unanswerable questions.
Question Answering System · Transformers
twmkn9 · Downloads: 4,152 · Likes: 4

Tapas Temporary Repo
Apache-2.0
TAPAS is a table-based question answering model that handles conversational QA tasks on tabular data through pre-training and fine-tuning.
Question Answering System · Transformers · English
lysandre · Downloads: 3,443 · Likes: 0

Wmt22 Cometkiwi Da
COMETKiwi is a model for machine translation quality estimation, capable of outputting quality scores based on source text and translated text.
Question Answering System · Multilingual
Unbabel · Downloads: 3,104 · Likes: 38

Splinter Base Qass
Apache-2.0
Splinter is a few-shot QA model pre-trained via self-supervised learning, utilizing the Recurrent Span Selection (RSS) objective to simulate the span selection process in extractive QA.
Question Answering System · Transformers · English
tau · Downloads: 3,048 · Likes: 1

Bert Base Cased Squad V1.1 Portuguese
MIT
A Portuguese question-answering model fine-tuned on the Portuguese SQUAD v1.1 dataset, based on BERTimbau base, suitable for Portuguese QA tasks.
Question Answering System · Other
pierreguillou · Downloads: 3,041 · Likes: 36

Dictalm2 It Qa Fine Tune
Apache-2.0
A fine-tuned version of Dicta-IL's dictalm2.0-instruct model, designed for generating Hebrew question-answer pairs.
Question Answering System · Transformers · Other
618AI · Downloads: 2,900 · Likes: 6

Roberta Base Chinese Extractive Qa
A Chinese extractive QA model based on the RoBERTa architecture, suitable for tasks that extract answers from given texts.
Question Answering System · Chinese
uer · Downloads: 2,694 · Likes: 98

Tapex Large Finetuned Wtq
MIT
TAPEX is a table pre-training model that learns via a neural SQL executor; based on the BART architecture, it is designed for table reasoning tasks.
Question Answering System · Transformers · English
microsoft · Downloads: 2,431 · Likes: 74

Flan T5 Base Squad2
MIT
An extractive QA model fine-tuned on the SQuAD2.0 dataset based on flan-t5-base, capable of handling question-answer pairs including unanswerable questions.
Question Answering System · Transformers · English
sjrhuschlee · Downloads: 2,425 · Likes: 4

Tapas Tiny Finetuned Sqa
Apache-2.0
TAPAS is a QA model for tabular data. This tiny version is fine-tuned on the SQA dataset, suitable for table-based QA tasks in conversational scenarios.
Question Answering System · Transformers · English
google · Downloads: 2,391 · Likes: 0

Dynamic Tinybert
Apache-2.0
Dynamic-TinyBERT is an efficient question answering model that improves inference efficiency through dynamic sequence length reduction, achieving up to 3.3x speedup while maintaining high accuracy.
Question Answering System · Transformers · English
Intel · Downloads: 2,184 · Likes: 78

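Dynamic sequence-length reduction drops uninformative token positions at inference time so later layers process shorter sequences. A rough sketch of the idea only; the importance scores below are made up, whereas the real model learns when and how much to shorten:

```python
import numpy as np

def shorten_sequence(tokens, importance, keep_ratio=0.5):
    """Keep only the highest-importance token positions, preserving
    their original order. A stand-in for learned length reduction."""
    k = max(1, int(round(len(tokens) * keep_ratio)))
    keep = np.sort(np.argsort(-np.asarray(importance))[:k])
    return [tokens[i] for i in keep]

tokens = ["[CLS]", "the", "capital", "of", "france", "[SEP]"]
importance = [0.9, 0.1, 0.8, 0.1, 0.85, 0.7]
print(shorten_sequence(tokens, importance, keep_ratio=0.5))
# -> ['[CLS]', 'capital', 'france']
```

Attention over a sequence scales quadratically with length, which is why halving the sequence can yield a super-linear speedup.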
Distill Bert Base Spanish Wwm Cased Finetuned Spa Squad2 Es
Apache-2.0
A Spanish Q&A model distilled from BETO, lighter and more efficient than the standard version.
Question Answering System · Spanish
mrm8488 · Downloads: 2,145 · Likes: 48

Biobert V1.1 Pubmed Squad V2
A question-answering model based on BioBERT v1.1 (PubMed), fine-tuned on the SQuAD V2 dataset for biomedical QA tasks.
Question Answering System
ktrapeznikov · Downloads: 2,127 · Likes: 3

Tapas Tiny Finetuned Wtq
Apache-2.0
TAPAS is a tiny Transformer model optimized for table question answering, gaining table comprehension through intermediate pretraining and chained fine-tuning on multiple datasets.
Question Answering System · Transformers · English
google · Downloads: 1,894 · Likes: 1

Bert Base Uncased Squad V1
MIT
A Q&A model based on BERT-base uncased, fine-tuned on the SQuAD1.1 dataset.
Question Answering System · English
csarron · Downloads: 1,893 · Likes: 13

Tapas Base Finetuned Sqa
Apache-2.0
A table question answering model based on the BERT architecture, enhanced with intermediate pretraining for numerical reasoning and fine-tuned on the SQA dataset.
Question Answering System · Transformers · English
google · Downloads: 1,867 · Likes: 6

Question Answering Roberta Base S V2
Apache-2.0
A RoBERTa-based question answering model specialized in inferring answer text, span, and confidence scores given a question and context.
Question Answering System · Transformers
consciousAI · Downloads: 1,832 · Likes: 10

Xlm Roberta Large Qa Multilingual Finedtuned Ru
Apache-2.0
This is a pretrained model based on the XLM-RoBERTa architecture, trained with masked language modeling objectives and fine-tuned on English and Russian question answering datasets.
Question Answering System · Transformers · Multilingual
AlexKay · Downloads: 1,814 · Likes: 48

Camembert Base Squadfr Fquad Piaf
A French Q&A model based on CamemBERT, fine-tuned on three French Q&A datasets: PIAF, FQuAD, and SQuAD-FR.
Question Answering System · Transformers · French
AgentPublic · Downloads: 1,789 · Likes: 28

Distilbert Base Uncased Distilled Squad Int8 Static Inc
Apache-2.0
This is the INT8 quantized version of the DistilBERT base uncased model, specifically designed for question answering tasks, optimized for model size and inference speed through post-training static quantization.
Question Answering System · Transformers
Intel · Downloads: 1,737 · Likes: 4